
    An Inherently Quantum Computation Paradigm: NP-complete=P Under the Hypothetical Notion of Continuous Uncomplete von Neumann Measurement

    The current quantum computation paradigm is a transposition of the Turing machine into the quantum framework. Implementations based on this paradigm face limitations on the number of qubits, the number of computation steps, and the number of efficient quantum algorithms found so far. A new, exclusively quantum paradigm (with no classical counterpart) is propounded, based on the speculative notion of continuous uncomplete von Neumann measurement. Under such a notion, NP-complete equals P. This can provide a mathematical framework for the search for implementable paradigms, possibly ones exploiting particle statistics.

    Comment: 1 figure. From the Quantum Computation and Communication Pathfinder Meeting in Helsinki (September 26-28, 1998) - extended version

    Completing the physical representation of quantum algorithms provides a quantitative explanation of their computational speedup

    The usual representation of quantum algorithms, limited to the process of solving the problem, is physically incomplete. We complete it in three steps: (i) extending the representation to the process of setting the problem, (ii) relativizing the extended representation to the problem solver, to whom the problem setting must be concealed, and (iii) symmetrizing the relativized representation for time reversal, to represent the reversibility of the underlying physical process. The third step projects the input state of the relativized representation, in which the problem solver is completely ignorant of the setting and thus of the solution of the problem, onto one in which she knows half of the solution (half of the information specifying it, when the solution is an unstructured bit string). Completing the physical representation shows that the number of computation steps (oracle queries) required to solve any oracle problem in an optimal quantum way should be that of a classical algorithm endowed with advance knowledge of half of the solution. This fits the major quantum algorithms known today and would solve the quantum query complexity problem.

    Comment: Explicitly addressed the controversial character of the work in an extended discussion, 24 pages
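    As a numeric illustration of this rule, consider unstructured search over N = 2**n items: a classical solver told half of the n solution bits is left with 2**(n/2) candidates, the same 2**(n/2) scaling as Grover's quantum query count. A minimal sketch (mine, not code from the paper; the function names and problem sizes are assumptions):

        # Minimal sketch: compare Grover's quantum query count with the
        # classical count under advance knowledge of half of the n solution
        # bits, for unstructured search over N = 2**n items.
        import math

        def grover_queries(N):
            # Grover finds the marked item with about (pi/4) * sqrt(N) oracle queries.
            return (math.pi / 4) * math.sqrt(N)

        def classical_queries_half_known(n):
            # Knowing n/2 of the n bits leaves 2**(n/2) candidates;
            # on average about half of them must be queried.
            return 2 ** (n // 2) / 2

        for n in (10, 20, 30):
            N = 2 ** n
            print(n, round(grover_queries(N)), round(classical_queries_half_known(n)))

    Both counts scale as 2**(n/2), matching to within a constant factor, in line with the abstract's claim that the optimal quantum count is that of a classical algorithm knowing half of the solution.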

    Discussing the explanation of the quantum speed up

    In former work, we showed that a quantum algorithm is the sum over the histories of a classical algorithm that knows in advance 50% of the information about the solution of the problem; each history is a possible way of getting the advance information and a possible result of computing the missing information. We gave a theoretical justification of this 50% advance-information rule and checked that it holds for a large variety of quantum algorithms. Here we discuss the theoretical justification in further detail and counter a possible objection. We show that the rule is a generalization of a simple, well-known explanation of quantum nonlocality, in which the logical correlation between measurement outcomes is physically backed by a causal/deterministic/local process, with causality allowed to go backward in time through backdated state vector reduction. The possible objection is that quantum algorithms often produce the solution of the problem in an apparently deterministic way (when their unitary part produces an eigenstate of the observable to be measured, measurement yields the corresponding eigenvalue - the solution - with probability 1), while the present explanation of the speed-up relies on the nondeterministic character of quantum measurement. We show that this objection would mistake the nondeterministic production of a definite outcome for a deterministic production.

    Comment: 6 pages, changed content: the explanations of the quantum speed-up and of nonlocality are related with one another
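    The deterministic-looking case mentioned in the objection can be checked directly. Below is a minimal simulation of Deutsch's algorithm (my sketch, not code from the paper): the unitary part drives the register into an eigenstate of the measured observable, so measurement returns the answer with probability 1, yet the outcome is still produced by a quantum measurement.

        # Deutsch's algorithm on one query qubit plus one ancilla.
        import numpy as np

        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

        def U_f(f):
            # Oracle |x, y> -> |x, y XOR f(x)> as a 4x4 permutation matrix.
            U = np.zeros((4, 4))
            for x in (0, 1):
                for y in (0, 1):
                    U[2 * x + (y ^ f(x)), 2 * x + y] = 1
            return U

        for name, f in [("constant", lambda x: 0), ("balanced", lambda x: x)]:
            state = np.kron(H @ np.array([1, 0]), H @ np.array([0, 1]))  # H|0> (x) H|1>
            state = U_f(f) @ state                    # one oracle query
            state = np.kron(H, np.eye(2)) @ state     # H on the query qubit
            p1 = float(state[2]) ** 2 + float(state[3]) ** 2  # P(query qubit = 1)
            print(name, "P(first qubit = 1) =", round(p1, 6))
        # constant -> 0.0, balanced -> 1.0: a definite outcome, produced
        # nondeterministically by quantum measurement of an eigenstate.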

    Non-Mechanism in Quantum Oracle Computing

    A typical oracle problem is finding which software program is installed on a computer by running the computer and testing its input-output behaviour. The program is randomly chosen from a set of programs known to the problem solver. As is well known, some oracle problems are solved more efficiently by quantum algorithms; this naturally implies making the computer quantum, while the choice of the software program remains sharp. In order to highlight the non-mechanistic origin of this higher efficiency, the uncertainty about which program is installed must also be represented in a quantum way.

    Comment: 9 text pages, 3 figures in one additional ps file, manuscript of the presentation to be held at the SILFS Workshop on Logic and Quantum Computation, Cesena, Italy, February 16, 199
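    A toy classical baseline for this identification problem (my sketch, with assumed names): for the four one-bit programs f: {0,1} -> {0,1}, brute-force the depth of the shortest adaptive sequence of input-output tests that always pins down the installed program.

        # Programs given as truth tables (f(0), f(1)).
        PROGRAMS = [(0, 0), (0, 1), (1, 0), (1, 1)]

        def queries_needed(candidates, inputs):
            # Minimum number of adaptive input-output tests that always
            # identifies the program among `candidates`.
            if len(candidates) <= 1:
                return 0
            best = len(inputs)
            for x in inputs:
                groups = {}
                for f in candidates:
                    groups.setdefault(f[x], []).append(f)
                worst = 1 + max(queries_needed(g, [i for i in inputs if i != x])
                                for g in groups.values())
                best = min(best, worst)
            return best

        print(queries_needed(PROGRAMS, [0, 1]))  # 2: both inputs must be tested

    Classically, both inputs must be run; Deutsch's quantum algorithm decides the constant-versus-balanced character of f with a single run, which is the kind of gain whose non-mechanistic origin the abstract addresses.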

    Origin of the quantum speed-up

    Bob chooses a function from a set of functions and gives Alice the black box that computes it. Alice is to find a characteristic of the function through function evaluations. In the quantum case, the number of function evaluations can be smaller than the minimum classically possible. The fundamental reason for this violation of a classical limit is not known. We trace it back to a disambiguation of the principle that measuring an observable determines one of its eigenvalues. Representing Bob's choice of the label of the function as the unitary transformation of a random quantum measurement outcome shows that: (i) finding the characteristic of the function on the part of Alice is a by-product of reconstructing Bob's choice, and (ii) because of the quantum correlation between choice and reconstruction, one cannot tell whether Bob's choice is determined by the action of Bob (initial measurement and successive unitary transformation) or by that of Alice (further unitary transformation and final measurement). Postulating that the determination is shared evenly between the two actions, in a uniform superposition of all the possible ways of sharing it, implies that quantum algorithms are superpositions of histories in each of which Alice knows in advance one of the possible halves of Bob's choice. Performing, in each history, only the function evaluations required to classically reconstruct Bob's choice given advance knowledge of half of it yields the quantum speed-up. In all the cases examined, this goes along with interleaving function evaluations with non-computational unitary transformations that each time maximize the amount of information about Bob's choice acquired by Alice through function evaluation.

    Comment: 21 pages, 1 figure. Corrected a misleading typing error at point III) of Section 3: "Do the same with B" replaced by "Do the same with V"; minor distributed text improvements. arXiv admin note: text overlap with arXiv:1101.435
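    The history counting can be made concrete on Deutsch's problem (a hedged sketch of mine, following the recipe above): Bob's choice is the truth table (f(0), f(1)), and the possible "advanced halves" are the two single table entries.

        # Enumerate the histories and count the function evaluations Alice
        # still needs in each, given advance knowledge of half of Bob's choice.
        from itertools import product

        def evaluations_needed(known_entries):
            # Each unknown table entry costs one function evaluation (oracle query).
            return 2 - len(known_entries)

        histories = [(f, known) for f in product((0, 1), repeat=2)  # Bob's choices
                     for known in (0, 1)]                           # advanced halves
        counts = [evaluations_needed({known}) for f, known in histories]
        print(set(counts))  # {1}: one evaluation per history, matching Deutsch's
                            # single quantum query (classically two are needed)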

    An exact relation between number of black box computations required to solve an oracle problem quantumly and quantum retrocausality

    We investigate the reason for the quantum speedup -- quantum algorithms requiring fewer computation steps than their classical counterparts. We extend their representation to the process of setting the problem. The initial measurement selects a setting at random; Bob (the problem setter) unitarily changes it into the desired one. This representation holds for Bob and any external observer; it cannot hold for Alice (the problem solver), since it would tell her the function computed by the black box, which to her should remain hidden inside it. We resort to relational quantum mechanics. To Alice, the projection of the quantum state due to the initial measurement is postponed to the end of her problem-solving action. To her, the algorithm's input state remains one of complete ignorance of the setting. By black box computations, she unitarily sends it into the output state that, for each possible setting, encodes the corresponding solution, acquired by the final measurement. We show that we can ascribe to the final measurement the selection of any part -- say the R-th part -- of the random outcome of the initial measurement. This projects the input state, to Alice, onto a state of lower entropy in which she knows a corresponding part of the problem setting. The quantum algorithm is a sum over classical histories in each of which Alice, knowing in advance one of the R-th parts of the setting, performs the black box computations still required to identify the solution. Given an oracle problem and a value of R, this retrocausality model provides the number of black box computations required to solve it. Conversely, given a known quantum algorithm, it yields the value of R that explains its speedup. R = 1/2 always yields the number of black box computations required by an existing quantum algorithm, and the order of magnitude of the number required by an optimal one.

    Comment: Added a short history of the birth of quantum computation, a positioning of the work, and various clarifications, 29 pages. arXiv admin note: text overlap with arXiv:1308.507
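    An illustrative calculation of the R-dependence (assumptions mine): for unstructured search over N = 2**n items, a classical solver told a fraction R of the n solution bits is left with 2**((1 - R) * n) candidates, hence that many black box computations up to a constant; R = 1/2 reproduces Grover's O(sqrt(N)) scaling.

        # Classical query count as a function of the advance-knowledge fraction R.
        import math

        def classical_queries(n, R):
            # Knowing R * n of the n solution bits leaves 2**((1 - R) * n) candidates.
            return 2 ** ((1 - R) * n)

        n = 20
        N = 2 ** n
        print("R=0   :", classical_queries(n, 0.0))      # brute force, ~N
        print("R=1/2 :", classical_queries(n, 0.5))      # ~sqrt(N)
        print("Grover:", (math.pi / 4) * math.sqrt(N))   # same order of magnitude
        print("R=1   :", classical_queries(n, 1.0))      # solution already known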

    Parallel Quantum Computation, the Library of Babel and Quantum Measurement as the Efficient Librarian

    The complementary roles played by parallel quantum computation and quantum measurement in originating the quantum speed-up are illustrated through an analogy with a famous metaphor by J. L. Borges.

    Comment: 2 pages, RevTex, no figures. An excerpt from a longer (referenced) work, with a justification of the quantum speed-up

    Quantum problem solving as simultaneous computation

    I provide an alternative way of seeing quantum computation. First, I describe an idealized classical problem-solving machine that, thanks to a many-body interaction, reversibly and nondeterministically produces the solution of the problem under the simultaneous influence of all the problem constraints. This requires a perfectly accurate, rigid, and reversible relation between the coordinates of the machine parts; the machine can be considered the many-body generalization of another perfect machine, the bouncing-ball model of reversible computation. The mathematical description of the machine, as it is, is applicable to quantum problem solving, an extension of quantum algorithms that comprises the physical representation of the problem-solution interdependence. The perfect relation between the coordinates of the machine parts is transferred to the populations of the reduced density operators of the parts of the computer register. The solution of the problem is reversibly and nondeterministically produced under the simultaneous influence of the state before measurement and the quantum principle. In light of the present notion of simultaneous computation, the quantum speed-up turns out to be "precognition" of the solution, namely a reduction of the initial ignorance of the solution obtained by backdating, to before the running of the algorithm, a time-symmetric part of the state vector reduction on the solution; as such, it is bounded by state vector reduction through an entropic inequality.

    PACS numbers: 03.67.Lx, 01.55.+b, 01.70.+w

    Comment: 12 pages, part of a work to be published in IJT

    An explanation of the quantum speed up

    In former work, we showed that a quantum algorithm requires the number of operations (oracle queries) of a classical algorithm that knows in advance 50% of the information that specifies the solution of the problem. We gave a preliminary theoretical justification of this "50% rule" and checked that the rule holds for a variety of quantum algorithms. Here, we make explicit the information about the solution available to the algorithm throughout the computation. The final projection on the solution becomes the acquisition of the knowledge of the solution on the part of the algorithm. Backdating, to before the running of the algorithm, a time-symmetric part of this projection feeds back to the input of the computation 50% of the information acquired by reading the solution.

    Comment: 15 pages, resubmitted to Phys. Rev. A; the 50% rule is now related to backdating, to before the running of the algorithm, a time-symmetric part of the final projection on the solution
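    A worked information count may make the feedback concrete (the labels are mine; the unstructured n-bit solution is the abstract's own unstructured case):

        \[
        I_{\mathrm{read}} = n \ \text{bits}, \qquad
        I_{\mathrm{backdated}} = \tfrac{1}{2}\, I_{\mathrm{read}} = \tfrac{n}{2} \ \text{bits}, \qquad
        I_{\mathrm{to\ compute}} = n - \tfrac{n}{2} = \tfrac{n}{2} \ \text{bits},
        \]

    so the computation proper must still discriminate among $2^{n/2}$ candidate solutions, the classical cost that the 50% rule equates with the quantum query count.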

    A Quantum Logic Gate Representation of Quantum Measurement: Reversing and Unifying the Two Steps of von Neumann's Model

    In former work, quantum computation has been shown to be a problem-solving process essentially affected by both the reversible dynamics leading to the state before measurement and the logical-mathematical constraints introduced by quantum measurement (in particular, the constraint that there is only one measurement outcome). This dual influence, originating from independent initial and final conditions, justifies the quantum computation speed-up and is not representable inside dynamics, namely as a one-way propagation. In this work, we reformulate von Neumann's model of quantum measurement in light of the above findings. We embed it in a broader representation, based on the quantum logic gate formalism, capable of describing the interplay between dynamical and non-dynamical constraints. The two steps of the original model, namely (1) dynamically reaching a complete entanglement between pointer and quantum object and (2) enforcing the one-outcome constraint, are unified and reversed. By representing step (2) right from the start, the same dynamics of step (1) yields a probability distribution of mutually exclusive measurement outcomes. This appears to be a more accurate and complete representation of quantum measurement.

    PACS: 03.67.-a, 03.67.Lx, 03.65.Bz

    Comment: 17 pages, RevTex, 1 PostScript file with figure, submitted to Int. J. Theor. Phys.